38 research outputs found

    Correlations of conductance peaks and transmission phases in deformed quantum dots

    We investigate the Coulomb blockade resonances and the phase of the transmission amplitude of a deformed ballistic quantum dot weakly coupled to leads. We show that preferred single-particle levels exist which stay close to the Fermi energy for a wide range of values of the gate voltage. These states give rise to sequences of Coulomb blockade resonances with correlated peak heights and transmission phases. The correlation of the peak heights becomes stronger with increasing temperature. The phase of the transmission amplitude shows lapses by π between the resonances. Implications for recent experiments on ballistic quantum dots are discussed. Comment: 29 pages, 9 eps-figures
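
    For reference only, and not taken from the paper itself, the notions of a transmission phase and a phase lapse of π can be illustrated with the textbook single-level Breit-Wigner amplitude:

```latex
% Textbook single-level Breit--Wigner form (illustration only, not from the paper):
\[
  t(E) = \frac{i\,\Gamma/2}{E - \varepsilon_0 + i\,\Gamma/2},
  \qquad
  \theta(E) = \arg t(E).
\]
% \theta(E) rises smoothly by \pi as E sweeps through the resonance at
% \varepsilon_0; a "phase lapse" is an abrupt drop of \theta by \pi between two
% resonances, at an energy where the full multi-level amplitude passes through zero.
```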

    Quality-Driven Disorder Handling for M-way Sliding Window Stream Joins

    Sliding window join is one of the most important operators for stream applications. To produce high-quality join results, a stream processing system must deal with the ubiquitous disorder within input streams, which is caused by network delays, asynchronous source clocks, etc. Disorder handling involves an inevitable tradeoff between the latency and the quality of the produced join results. To meet the different requirements of stream applications, it is desirable to provide a user-configurable result-latency vs. result-quality tradeoff. Existing disorder handling approaches either do not provide such configurability or support only user-specified latency constraints. In this work, we advocate the idea of quality-driven disorder handling and propose a buffer-based disorder handling approach for sliding window joins, which minimizes the sizes of the input-sorting buffers, and thus the result latency, while respecting user-specified result-quality requirements. The core of our approach is an analytical model which directly captures the relationship between the sizes of the input buffers and the produced result quality. Our approach is generic: it supports m-way sliding window joins with arbitrary join conditions. Experiments on real-world and synthetic datasets show that, compared to the state of the art, our approach can reduce the result latency incurred by disorder handling by up to 95% while providing the same level of result quality. Comment: 12 pages, 11 figures, IEEE ICDE 201
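
    As a purely illustrative sketch of the buffer-based idea, and not the paper's operator or quality model, the snippet below shows a K-slack-style input-sorting buffer whose size controls the latency vs. quality tradeoff; the names (`SortBuffer`, the example stream, the chosen size) are hypothetical:

```python
import heapq

class SortBuffer:
    """Minimal input-sorting buffer (K-slack-style sketch): tuples are kept in a
    min-heap ordered by timestamp and released in order once the buffer exceeds
    `size`. A larger buffer masks more disorder (higher join result quality) but
    delays tuples longer (higher result latency)."""

    def __init__(self, size):
        self.size = size
        self._heap = []

    def push(self, ts, payload):
        """Insert a possibly out-of-order tuple; return a released tuple or None."""
        heapq.heappush(self._heap, (ts, payload))
        if len(self._heap) > self.size:
            return heapq.heappop(self._heap)
        return None

    def flush(self):
        """Release all remaining tuples in timestamp order (e.g. at stream end)."""
        while self._heap:
            yield heapq.heappop(self._heap)


# Usage: a disordered stream is reordered before it feeds the window join.
stream = [(1, "a"), (4, "b"), (2, "c"), (3, "d"), (6, "e"), (5, "f")]
buf = SortBuffer(size=2)  # a quality-driven model would choose this size
ordered = [out for ts, p in stream if (out := buf.push(ts, p))]
ordered += list(buf.flush())
print(ordered)  # tuples leave the buffer in timestamp order
```

    In the paper's setting, the buffer sizes would be chosen by the analytical quality model described in the abstract rather than fixed by hand.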

    Approximate Query Answering and Result Refinement on XML Data

    Today, many economic decisions are based on the fast analysis of XML data. Yet, the time to process analytical XML queries is typically high. Although current XML techniques focus on the optimization of query processing, none of them provides early approximate feedback as it is possible in relational Online Aggregation systems. In this paper, we introduce a system that provides fast estimates for XML aggregation queries. During processing, these estimates and their associated confidence bounds improve continuously. In our evaluation, we show that, without significantly increasing the overall execution time, our system returns accurate estimates of the final answer long before traditional systems are able to produce any output.
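
    The following sketch illustrates the general Online Aggregation idea of an early, continuously improving estimate with confidence bounds; it is not the paper's XML-specific system, and the function name, the random scan order, and the CLT-based bound are assumptions made purely for illustration:

```python
import math
import random

def approximate_sum(values, z=1.96, report_every=1000):
    """Online-aggregation-style sketch: scan the values in random order and keep
    a running estimate of their total plus a CLT-based confidence interval that
    tightens as more values are seen."""
    n_total = len(values)
    order = list(values)
    random.shuffle(order)  # a random scan order justifies the CLT-based bound

    s = s2 = 0.0
    for i, v in enumerate(order, start=1):
        s += v
        s2 += v * v
        if i % report_every == 0 or i == n_total:
            mean = s / i
            var = max(s2 / i - mean * mean, 0.0)
            est_total = mean * n_total                      # scaled-up estimate
            half_width = z * math.sqrt(var / i) * n_total   # confidence bound
            yield i, est_total, half_width

# Usage: the estimate converges to the exact sum long before the scan finishes.
data = [random.gauss(10, 3) for _ in range(100_000)]
for seen, est, hw in approximate_sum(data):
    print(f"after {seen:6d} values: {est:12.1f} +/- {hw:9.1f}")
```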

    Forecasting Evolving Time Series of Energy Demand and Supply

    Real-time balancing of energy demand and supply requires accurate and efficient forecasting in order to take future consumption and production into account. The need for these balancing capabilities arises from emerging energy market developments, which also pose new, so far unaddressed challenges to forecasting in the energy domain: First, real-time balancing requires accurate forecasts at any point in time. Second, the hierarchical market organization motivates forecasting in a distributed system environment. In this paper, we present an approach that adapts forecasting to the hierarchical organization of today’s energy markets. Furthermore, we introduce a forecasting framework that allows efficient forecasting and forecast model maintenance for time series that evolve due to continuous streams of measurements. This framework includes model evaluation and adaptation techniques that enhance the model maintenance process by exploiting context knowledge from previous model adaptations. With this approach, (1) more accurate forecasts can be produced within the same time budget, or (2) forecasts of similar accuracy can be produced in less time.
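
    A minimal sketch of such maintenance for an evolving time series, assuming a simple exponential smoothing model and a plain error-threshold trigger as stand-ins for the paper's framework (the class `EvolvingForecaster` and its parameters are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ses_forecast(series, alpha):
    """One-step-ahead forecasts of simple exponential smoothing."""
    level, preds = series[0], []
    for y in series:
        preds.append(level)
        level = alpha * y + (1 - alpha) * level
    return np.array(preds), level

def fit_alpha(series):
    """Estimate the smoothing parameter by minimizing in-sample squared error."""
    err = lambda a: np.mean((series - ses_forecast(series, a)[0]) ** 2)
    return minimize_scalar(err, bounds=(0.01, 0.99), method="bounded").x

class EvolvingForecaster:
    """Sketch of forecast model maintenance for an evolving time series: new
    measurements are appended continuously, and the model is only re-estimated
    when the recent forecast error degrades, instead of refitting from scratch
    on every append."""

    def __init__(self, history, threshold=1.5, window=50):
        self.series = list(history)
        self.threshold, self.window = threshold, window
        self._refit()

    def _refit(self):
        data = np.array(self.series)
        self.alpha = fit_alpha(data)           # expensive parameter estimation
        _, self.level = ses_forecast(data, self.alpha)
        self.base_err = self._recent_error()

    def _recent_error(self):
        recent = np.array(self.series[-self.window:])
        preds, _ = ses_forecast(recent, self.alpha)
        return np.mean((recent - preds) ** 2)

    def append(self, y):
        self.series.append(y)
        self.level = self.alpha * y + (1 - self.alpha) * self.level  # cheap update
        if self._recent_error() > self.threshold * self.base_err:
            self._refit()                       # re-estimate only when needed

    def forecast(self):
        """Next-step forecast from the current smoothing level."""
        return self.level
```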

    Partitioning and Multi-core Parallelization of Multi-equation Forecast Models

    Forecasting is an important analysis technique used in many application domains such as electricity management, sales and retail, and traffic prediction. The employed statistical models already provide very accurate predictions, but recent developments in these domains pose new requirements on the calculation speed of the forecast models. In particular, the often-used multi-equation models tend to be very complex, and their estimation is very time consuming. To still allow the use of these highly accurate forecast models, it is necessary to improve the data processing capabilities of the involved data management systems. For this purpose, we introduce a partitioning approach for multi-equation forecast models that considers the specific data access pattern of these models to optimize data storage and memory access. With the help of our approach, we avoid redundant reads of unnecessary values and improve the utilization of the CPU cache. Furthermore, we exploit the capabilities of modern multi-core hardware and parallelize the model estimation. Our experimental results on real-world data show speedups of up to 73x for the initial model estimation. Thus, our partitioning and parallelization approach significantly increases the efficiency of multi-equation models.
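
    A minimal sketch of the idea, assuming a toy hour-of-day multi-equation model (one AR(1) equation per hour of the day): each equation's observations are stored contiguously so that its estimation scans only its own partition, and the independent equations are estimated in parallel on separate cores. The partitioning scheme and the per-equation model here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

HOURS = 24  # one equation per hour of day, as in many multi-equation load models

def partition_by_hour(series):
    """Rearrange an hourly series so that all observations belonging to one
    equation (one hour of day) lie contiguously in memory; each equation then
    scans only its own partition instead of striding through the full series."""
    n_days = len(series) // HOURS
    mat = np.asarray(series[: n_days * HOURS]).reshape(n_days, HOURS)
    return [np.ascontiguousarray(mat[:, h]) for h in range(HOURS)]

def estimate_equation(part):
    """Toy per-equation estimation: fit an AR(1) with intercept on one partition."""
    y, x = part[1:], part[:-1]
    design = np.vstack([x, np.ones_like(x)]).T
    return np.linalg.lstsq(design, y, rcond=None)[0]

def estimate_all(series, workers=4):
    parts = partition_by_hour(series)
    # The equations are independent, so they are estimated on separate cores.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(estimate_equation, parts))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    load = rng.normal(size=365 * HOURS).cumsum()  # synthetic hourly demand
    print(estimate_all(load)[:2])
```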

    Efficient Integration of External Information into Forecast Models from the Energy Domain

    Forecasting is an important analysis technique to support decisions and functionalities in many application domains. While the employed statistical models often provide sufficient accuracy, recent developments pose new challenges to the forecasting process. Typically, the time available for estimating the forecast models and providing accurate predictions is decreasing significantly. This is especially an issue in the energy domain, where forecast models often consider external influences to provide high accuracy. As a result, these models exhibit a larger number of parameters, resulting in increased estimation effort. Also, in the energy domain new measurements are constantly appended to the time series, requiring a continuous adaptation of the models to new developments. This typically involves a parameter re-estimation, which is often almost as expensive as the initial estimation and conflicts with the requirement for fast forecast computation. To address these challenges, we present a framework that allows a more efficient integration of external information. First, external information is handled in a separate model, because its linear and non-linear relationships are more stable and it can thus be excluded from most forecast model adaptations. Second, we directly optimize the separate model using feature selection and dimension reduction techniques. Our evaluation shows that our approach allows an efficient integration of external information and thus an increased forecasting accuracy, while reducing the re-estimation effort.
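
    As an illustration of keeping external information in a separate, rarely re-estimated sub-model, the sketch below combines simple feature selection with PCA-based dimension reduction; `ExternalModel`, the chosen techniques, and the residual-based coupling to the base model are assumptions made for illustration, not the paper's design:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import LinearRegression

class ExternalModel:
    """Separate sub-model for external influences (e.g. weather or calendar
    features): the features are pruned and compressed once, and the sub-model
    is reused across base-model re-estimations because its relationship to the
    target is comparatively stable."""

    def __init__(self, k_features=10, n_components=3):
        self.select = SelectKBest(f_regression, k=k_features)  # feature selection
        self.pca = PCA(n_components=n_components)              # dimension reduction
        self.reg = LinearRegression()

    def fit(self, X_ext, y):
        Z = self.pca.fit_transform(self.select.fit_transform(X_ext, y))
        self.reg.fit(Z, y)
        return self

    def predict(self, X_ext):
        return self.reg.predict(self.pca.transform(self.select.transform(X_ext)))

# Usage: only the base (endogenous) model is re-estimated on new measurements;
# it is fit on the residuals that remain after the external sub-model.
rng = np.random.default_rng(1)
X_ext = rng.normal(size=(500, 40))            # e.g. 40 external features
y = 2.0 * X_ext[:, 0] + rng.normal(size=500)  # target driven by one external
ext = ExternalModel().fit(X_ext, y)
residual = y - ext.predict(X_ext)             # input for the base forecast model
```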

    Context-Aware Parameter Estimation for Forecast Models in the Energy Domain

    Continuous balancing of energy demand and supply is a fundamental prerequisite for the stability and efficiency of energy grids. This balancing task requires accurate forecasts of future electricity consumption and production at any point in time. For this purpose, database systems need to be able to rapidly process forecasting queries and to provide accurate results in short time frames. However, time series from the electricity domain pose the challenge that measurements are constantly appended to the time series. A naive maintenance approach for such evolving time series would re-estimate the employed mathematical forecast model from scratch for each new measurement, which is very time consuming. We speed up forecast model maintenance by exploiting the particularities of electricity time series to reuse previously employed forecast models and their parameter combinations. These parameter combinations, together with information about the context in which they were valid, are stored in a repository. We compare the current context with contexts from the repository to retrieve parameter combinations that were valid in similar contexts as starting points for further optimization. An evaluation shows that our approach improves the maintenance process, especially for complex models, by providing more accurate forecasts in less time than comparable estimation methods.
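
    A minimal sketch of the repository idea, assuming a Euclidean context distance and a generic numerical optimizer as stand-ins for the paper's similarity measure and estimation procedure (`ParameterRepository` and `estimate` are hypothetical names):

```python
import numpy as np
from scipy.optimize import minimize

class ParameterRepository:
    """Stores previously valid parameter combinations together with a numeric
    description of the context (e.g. season, level, weekday profile) in which
    they were estimated; new estimations start from the parameters of the most
    similar stored context instead of a cold start."""

    def __init__(self):
        self.contexts, self.params = [], []

    def add(self, context, parameters):
        self.contexts.append(np.asarray(context, dtype=float))
        self.params.append(np.asarray(parameters, dtype=float))

    def closest(self, context, default):
        if not self.contexts:
            return np.asarray(default, dtype=float)
        dists = [np.linalg.norm(c - context) for c in self.contexts]
        return self.params[int(np.argmin(dists))]

def estimate(loss, context, repo, default_start):
    """Warm-start the expensive parameter search from repository knowledge,
    then store the result for future, similar contexts."""
    start = repo.closest(np.asarray(context, dtype=float), default_start)
    result = minimize(loss, start, method="Nelder-Mead")
    repo.add(context, result.x)
    return result.x

# Usage with a toy loss; the second call starts near the first call's optimum.
repo = ParameterRepository()
loss = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2
estimate(loss, context=[0.2, 0.8], repo=repo, default_start=[0.0, 0.0])
print(estimate(loss, context=[0.25, 0.8], repo=repo, default_start=[0.0, 0.0]))
```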